Gaussian Maximum of Entropy and Reversed Log-Sobolev Inequality
Author
Abstract
The aim of this note is to connect a reversed form of the Gross logarithmic Sobolev inequality with the Gaussian maximum of Shannon's entropy power. There is thus a complete parallel with the well-known link between logarithmic Sobolev inequalities and their information-theoretic counterparts. We moreover provide an elementary proof of the reversed Gross inequality via a two-point inequality and the Central Limit Theorem.

1. Shannon's entropy power and Gross's inequality

In the sequel, we denote by Entμ(f) the entropy of a non-negative integrable function f with respect to a positive measure μ, defined by

    Entμ(f) := ∫ f log f dμ − (∫ f dμ) log(∫ f dμ).

The Shannon entropy [15] of an n-variate random vector X with probability density function (pdf) f is given by

    H(X) := −Entλn(f) = −∫ f log f dx,

where dx denotes the n-dimensional Lebesgue measure on Rn. The Shannon entropy power [15] of X is then given by

    N(X) := (1/(2πe)) exp((2/n) H(X)).

It is well known (cf. [15, 8]) that Gaussians saturate this entropy power at fixed covariance. Namely, for any n-variate random vector X with covariance matrix K(X), one has

    N(X) ≤ |K(X)|^(1/n),    (1)

and |K|^(1/n) is the entropy power of the n-dimensional Gaussian with covariance K, where |K| denotes the determinant of K. The logarithmic Sobolev inequality of Gross [11] expresses that, for any non-negative smooth function f : Rn → R+,

    2 Entγn(f) ≤ Eγn(|∇f|² / f).
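Inequality (1) can be checked numerically from the definitions above. The sketch below (a minimal illustration; the helper name `entropy_power` is ours, not from the paper) verifies that an n-variate Gaussian with covariance K attains N(X) = |K|^(1/n), and that two simple one-dimensional non-Gaussian laws with known closed-form entropies fall strictly below their variance, as (1) predicts.

```python
import math

# Entropy power N(X) = (1/(2*pi*e)) * exp((2/n) * H(X)) for an n-variate
# random vector X with differential entropy H(X) measured in nats.
def entropy_power(H, n):
    return math.exp(2.0 * H / n) / (2.0 * math.pi * math.e)

# 1) An n-variate Gaussian with covariance K has
#    H = (1/2) * log((2*pi*e)^n * det K), hence N(X) = (det K)^(1/n):
#    equality in (1).
n, det_K = 3, 5.0
H_gauss = 0.5 * math.log((2.0 * math.pi * math.e) ** n * det_K)
assert math.isclose(entropy_power(H_gauss, n), det_K ** (1.0 / n))

# 2) One-dimensional non-Gaussian examples, where |K(X)| is the variance:
#    uniform on [0, 1]:        H = 0 nats, variance 1/12;
#    exponential with rate 1:  H = 1 nat,  variance 1.
for H, variance in [(0.0, 1.0 / 12.0), (1.0, 1.0)]:
    assert entropy_power(H, 1) < variance  # strict: these laws are not Gaussian
```

For the uniform law, N = 1/(2πe) ≈ 0.0586 < 1/12 ≈ 0.0833; for the exponential, N = e/(2π) ≈ 0.4327 < 1.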
Similar resources
Generalization of an Inequality by Talagrand, and Links with the Logarithmic Sobolev Inequality
We show that transport inequalities, similar to the one derived by Talagrand [30] for the Gaussian measure, are implied by logarithmic Sobolev inequalities. Conversely, Talagrand’s inequality implies a logarithmic Sobolev inequality if the density of the measure is approximately log-concave, in a precise sense. All constants are independent of the dimension, and optimal in certain cases. The pr...
Logarithmic Sobolev Inequalities and Spectral Gaps
We prove a simple entropy inequality and apply it to the problem of determining log-Sobolev constants.
Entropy Jumps for Radially Symmetric Random Vectors
We establish a quantitative bound on the entropy jump associated to the sum of independent, identically distributed (IID) radially symmetric random vectors having dimension greater than one. Following the usual approach, we first consider the analogous problem of Fisher information dissipation, and then integrate along the Ornstein-Uhlenbeck semigroup to obtain an entropic inequality. In a depa...
Modified Logarithmic Sobolev Inequalities in Discrete Settings
Motivated by the rate at which the entropy of an ergodic Markov chain relative to its stationary distribution decays to zero, we study modified versions of logarithmic Sobolev inequalities in the discrete setting of finite Markov chains and graphs. These inequalities turn out to be weaker than the standard log-Sobolev inequality, but stronger than the Poincaré (spectral gap) inequality. We sho...
Functional Inequalities for Gaussian and Log-Concave Probability Measures
We give three proofs of a functional inequality for the standard Gaussian measure originally due to William Beckner. The first uses the central limit theorem and a tensorial property of the inequality. The second uses the Ornstein-Uhlenbeck semigroup, and the third uses the heat semigroup. These latter two proofs yield a more general inequality than the one Beckner originally proved. We then ge...
Journal:
Volume / issue:
Pages: -
Publication year: 2007